Appeared in Neural Networks, 1998

Automatic Early Stopping Using Cross Validation: Quantifying the Criteria

Author

  • Lutz Prechelt
Abstract

Cross validation can be used to detect when overfitting starts during supervised training of a neural network; training is then stopped before convergence to avoid the overfitting ("early stopping"). The exact criterion used for cross validation based early stopping, however, is chosen in an ad-hoc fashion by most researchers, or training is stopped interactively. To aid a more well-founded selection of the stopping criterion, 14 different automatic stopping criteria from 3 classes were evaluated empirically for their efficiency and effectiveness in 12 different classification and approximation tasks using multi-layer perceptrons with RPROP training. The experiments show that, on average, slower stopping criteria allow for small improvements in generalization (on the order of 4%), but cost about a factor of 4 in additional training time.
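The general scheme the abstract describes can be sketched in a few lines: track the validation (cross validation) error during training and stop once a criterion fires. The sketch below uses a "generalization loss" style criterion (stop when the current validation error exceeds the best seen so far by more than a threshold percentage); the function name, the threshold value, and the error sequence are illustrative assumptions, not the paper's exact experimental setup.

```python
# Illustrative sketch of validation-based early stopping (not the
# paper's exact criteria). The criterion stops training once the
# "generalization loss" GL(t) = 100 * (E_va(t) / E_opt(t) - 1)
# exceeds alpha percent, where E_va(t) is the validation error at
# epoch t and E_opt(t) is the lowest validation error seen so far.

def gl_criterion(val_errors, alpha=5.0):
    """Return True once the generalization loss exceeds alpha percent."""
    e_opt = min(val_errors)                      # best validation error so far
    gl = 100.0 * (val_errors[-1] / e_opt - 1.0)  # relative increase in percent
    return gl > alpha

# Hypothetical validation-error trace: the error falls, then rises
# again as overfitting sets in; training stops at the first epoch
# whose error is more than 5% above the minimum.
errors = []
for e in [0.50, 0.40, 0.35, 0.34, 0.36, 0.39]:
    errors.append(e)
    if gl_criterion(errors, alpha=5.0):
        break
```

A larger alpha corresponds to a "slower" criterion in the abstract's sense: it tolerates a longer rise in validation error before stopping, trading extra training time for a chance at slightly better generalization.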


Similar resources


Cluster Analysis of Neural Network Weights for Discrimination of Eeg Signals

Category: Applications. No part of this work has been submitted or appeared in other scientific conferences. Abstract: Neural networks are trained to classify half-second segments of six-channel EEG data into one of five classes corresponding to five cognitive tasks performed by one subject. Two- and three-layer feed-forward neural networks are trained using 10-fold cross-validation and early stoppi...


Asymptotic statistical theory of overtraining and cross-validation

A statistical theory for overtraining is proposed. The analysis treats general realizable stochastic neural networks, trained with Kullback-Leibler divergence in the asymptotic case of a large number of training examples. It is shown that the asymptotic gain in the generalization error is small if we perform early stopping, even if we have access to the optimal stopping time. Based on the cross...


Classification of EEG Signals from Four Subjects During Five Mental Tasks

Neural networks are trained to classify half-second segments of six-channel, EEG data into one of five classes corresponding to five cognitive tasks performed by four subjects. Two and three-layer feedforward neural networks are trained using 10-fold cross-validation and early stopping to control over-fitting. EEG signals were represented as autoregressive (AR) models. The average percentage of...


Development of Soft Sensor to Estimate Multiphase Flow Rates Using Neural Networks and Early Stopping

This paper proposes a soft sensor to estimate phase flow rates utilizing common measurements in oil and gas production wells. The developed system addresses the limited production monitoring that results from relying on conventional metering facilities. It offers a cost-effective solution to meet real-time monitoring demands, reduces operational and maintenance costs, and acts as a back-up to multiphase flow meters....



Journal title:

Volume   Issue

Pages  -

Publication date: 1998